- entropy inequality
- 熵不等式
English-Chinese dictionary of mechanical engineering (英汉机械工程大词典). 2013.
Look at other dictionaries:

Entropy — In thermodynamics, entropy is a state function that measures the portion of a system's energy unavailable for doing work, often interpreted as the degree of molecular disorder. For entropy in information theory, see Entropy (information theory); for a comparison of the two notions, see Entropy in thermodynamics and information theory. (Wikipedia)
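For reference, the classical Clausius definition (a standard formula, not part of the original snippet) relates the entropy change along a reversible path to the heat exchanged:

\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}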
Entropy power inequality — In mathematics, the entropy power inequality is a result in probability theory concerning the so-called entropy power of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. (Wikipedia)
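One standard statement, for independent random vectors X and Y in R^n with densities, where h denotes differential entropy:

N(X + Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}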
Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message. (Wikipedia)
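For a discrete random variable X with probability mass function p, the Shannon entropy is:

H(X) = -\sum_{x} p(x) \log p(x)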
Clausius–Duhem inequality — In continuum mechanics, the Clausius–Duhem inequality expresses the second law of thermodynamics (the entropy inequality) in a form used to determine whether the constitutive relations of a material are thermodynamically admissible. (Wikipedia)
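A commonly quoted local form, reconstructed here from standard references rather than from the truncated snippet, with ρ the density, η the specific entropy, q the heat flux vector, s the specific heat supply, and T the absolute temperature:

\rho\,\dot{\eta} + \nabla \cdot \left(\frac{\mathbf{q}}{T}\right) - \frac{\rho s}{T} \ge 0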
Volume entropy — Among the various notions of entropy found in dynamical systems, differential geometry, and geometric group theory, an important role is played by the volume entropy. Let (M, g) be a closed surface with a Riemannian metric g, and denote by (\tilde{M}, \tilde{g}) its universal cover with the lifted metric. (Wikipedia)
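The volume entropy is then the exponential growth rate of metric balls in the universal cover, independent of the chosen center \tilde{x} (standard definition):

h_{\mathrm{vol}}(M, g) = \lim_{R \to \infty} \frac{1}{R} \log \mathrm{vol}\, B_{\tilde{g}}(\tilde{x}, R)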
Income inequality metrics — The concept of inequality is distinct from those of poverty and fairness. Income inequality metrics or income distribution metrics are used by social scientists to measure the distribution of income and economic inequality among the participants in an economy. (Wikipedia)
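One widely used such metric, not named in the snippet but added here as an illustration, is the Gini coefficient; for incomes x_1, …, x_n with mean \bar{x}:

G = \frac{\sum_{i=1}^{n} \sum_{j=1}^{n} |x_i - x_j|}{2 n^2 \bar{x}}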
Quantum relative entropy — In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum-mechanical analog of classical relative entropy. (Wikipedia)
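For density matrices ρ and σ, the standard definition (taken to be +∞ when the support of ρ is not contained in that of σ) is:

S(\rho \,\|\, \sigma) = \mathrm{Tr}\,\rho \left(\log \rho - \log \sigma\right)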
Gibbs' inequality — In information theory, Gibbs' inequality is a statement about the mathematical entropy of a discrete probability distribution. Several other bounds on the entropy of probability distributions are derived from Gibbs' inequality, including Fano's inequality. (Wikipedia)
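For probability distributions p = (p_1, …, p_n) and q = (q_1, …, q_n) over the same index set, Gibbs' inequality states, with equality if and only if p = q:

-\sum_{i=1}^{n} p_i \log p_i \le -\sum_{i=1}^{n} p_i \log q_i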
Von Neumann entropy — In quantum statistical mechanics, von Neumann entropy refers to the extension of classical entropy concepts to the field of quantum mechanics. John von Neumann rigorously established the correct mathematical framework for quantum mechanics with his 1932 work Mathematische Grundlagen der Quantenmechanik. (Wikipedia)
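For a density matrix ρ, the von Neumann entropy is (standard definition):

S(\rho) = -\mathrm{Tr}\,(\rho \ln \rho)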
Joint entropy — The joint entropy is an entropy measure used in information theory. It measures how much entropy is contained in a joint system of two random variables. If the random variables are X and Y, the joint entropy is written H(X, Y). Like other entropies, it can be measured in bits or nats, depending on the base of the logarithm. (Wikipedia)
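For discrete X and Y with joint probability mass function p(x, y):

H(X, Y) = -\sum_{x} \sum_{y} p(x, y) \log p(x, y)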
Fano's inequality — In information theory, Fano's inequality (also known as the Fano converse and the Fano lemma) relates the average information lost in a noisy channel to the probability of the categorization error. It was derived by Robert Fano in the early 1950s. (Wikipedia)
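One common form: if X takes values in a finite alphabet \mathcal{X} and \hat{X} is an estimate of X computed from Y, with error probability P_e = P(\hat{X} \ne X) and H_b the binary entropy function, then:

H(X \mid Y) \le H_b(P_e) + P_e \log\left(|\mathcal{X}| - 1\right)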